Efficient computational schemes for the orthogonal least squares algorithm

Authors

  • Chng Eng Siong
  • Sheng Chen
  • Bernard Mulgrew
Abstract

The orthogonal least squares (OLS) algorithm is an efficient implementation of the forward selection method for subset model selection. The ability to find good subset parameters with only a linearly increasing computational requirement makes this method attractive for practical implementations. In this correspondence, we examine the computational complexity of the algorithm and present a preprocessing method for reducing the computational requirement.
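
The abstract refers to the standard forward-selection OLS procedure without reproducing it. The sketch below is a minimal, unoptimised illustration of that selection loop, using classical Gram-Schmidt orthogonalisation and the error-reduction ratio as the selection criterion; the function name ols_forward_selection and the 1e-12 tolerance are illustrative assumptions, and this is not the reduced-complexity preprocessing scheme the correspondence proposes.

```python
import numpy as np

def ols_forward_selection(X, y, n_terms):
    """Greedy forward selection of n_terms columns of X to model y.

    Each remaining candidate column is orthogonalised against the columns
    already selected (classical Gram-Schmidt), and the candidate with the
    largest error-reduction ratio is added to the model.
    """
    n, m = X.shape
    total_energy = float(y @ y)        # y'y, used to normalise the criterion
    Q = np.empty((n, 0))               # orthonormal basis of selected columns
    selected = []

    for _ in range(n_terms):
        best_j, best_err, best_w = -1, -1.0, None
        for j in range(m):
            if j in selected:
                continue
            w = X[:, j] - Q @ (Q.T @ X[:, j])    # remove components along Q
            energy = float(w @ w)
            if energy < 1e-12:                   # (nearly) dependent candidate
                continue
            g = float(w @ y) / energy
            err = g * g * energy / total_energy  # error-reduction ratio
            if err > best_err:
                best_j, best_err, best_w = j, err, w
        if best_j < 0:
            break                                # no usable candidate remains
        selected.append(best_j)
        Q = np.column_stack([Q, best_w / np.sqrt(best_w @ best_w)])
    return selected
```

Each selection pass scans all remaining candidates, so the cost grows with the number of candidates and selected terms; the preprocessing discussed in the correspondence is aimed at reducing the computational requirement of this kind of loop.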


Related articles

Fast orthogonal least squares algorithm for efficient subset model selection

An efficient implementation of the orthogonal least squares algorithm for subset model selection is derived in this correspondence. The computational complexity of the algorithm is examined, and the results show that this new fast orthogonal least squares algorithm significantly reduces the computational requirements.


Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis

We present a structured algorithm for solving constrained nonlinear least squares problems, and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme due to Mahdavi-Amiri and Bartels of the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...


Convergence of an efficient local least-squares fitting method for bases with compact support

The least-squares projection procedure appears frequently in mathematics, science, and engineering. It possesses the well-known property that a least-squares approximation (formed via orthogonal projection) to a given data set provides an optimal fit in the chosen norm. The orthogonal projection of the data onto a finite basis is typically approached by the inversion of a Gram matrix involving ...
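
For concreteness, the Gram-matrix formulation mentioned above can be sketched as follows; the basis matrix B (basis functions evaluated at the data sites), the data vector f, and the helper name least_squares_fit are assumptions for illustration, and the normal equations are solved directly rather than the Gram matrix being explicitly inverted.

```python
import numpy as np

def least_squares_fit(B, f):
    """Least-squares coefficients of the data f in the basis spanned by the
    columns of B, obtained from the normal equations G c = b."""
    G = B.T @ B                   # Gram matrix of the basis functions
    b = B.T @ f                   # inner products of the data with the basis
    return np.linalg.solve(G, b)  # solve G c = b without forming G^{-1}
```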


Robust Identification for Linear-in-the-parameters Models

In this paper new robust nonlinear model construction algorithms for a large class of linear-in-the-parameters models are introduced to enhance model robustness, including three algorithms using combined A- or D-optimality or the PRESS statistic (Predicted REsidual Sum of Squares) with the regularised orthogonal least squares algorithm, respectively. A common characteristic of these algorithms is that the...


Efficient Sparse Recovery Pursuits with Least Squares

We present a new greedy strategy, with an efficient implementation technique, that enjoys computational complexity and stopping criteria similar to those of OMP. Moreover, its recovery performance in the noise-free and Gaussian-noise cases is comparable to, and in many cases better than, that of other existing sparse recovery algorithms, with respect to both theoretical and empirical reconstruction ability....



Journal:
  • IEEE Trans. Signal Processing

Volume 43, Issue

Pages -

Publication date: 1995